Role of impact ionization in the thermalization of photo-excited Mott insulators
We study the influence of the pulse energy and fluence on the thermalization
of photo-doped Mott insulators. If the Mott gap is smaller than the width of
the Hubbard bands, the kinetic energy of individual carriers can be large
enough to produce doublon-hole pairs via a process analogous to impact
ionization. The thermalization dynamics, which involves an adjustment of the
doublon and hole densities, thus changes as a function of the energy of the
photo-doped carriers and exhibits two timescales -- a fast relaxation related
to impact ionization and a slower timescale associated with higher-order
scattering processes. The slow dynamics depends more strongly on the gap size
and the photo-doping concentration.
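The two-timescale relaxation described above can be made concrete with a toy model of the doublon density approaching its thermal value via a fast (impact-ionization) and a slow (higher-order scattering) channel. The functional form and all parameter values below are purely illustrative assumptions, not results from the paper's simulations.

```python
import numpy as np

# Illustrative two-timescale relaxation of the photo-induced doublon
# density n_d(t). All parameters are hypothetical, chosen only to show
# the separation between the fast and the slow channel.
tau_fast = 5.0               # fast timescale: impact ionization
tau_slow = 80.0              # slow timescale: higher-order scattering
n_inf = 0.12                 # thermal doublon density after thermalization
a_fast, a_slow = 0.04, 0.02  # amplitudes of the two relaxation channels

def doublon_density(t):
    """Doublon density rising toward n_inf on two separate timescales."""
    return n_inf - a_fast * np.exp(-t / tau_fast) - a_slow * np.exp(-t / tau_slow)

t = np.linspace(0.0, 400.0, 401)
n_d = doublon_density(t)
# The fast channel saturates within a few tau_fast; the late-time
# approach to n_inf is governed entirely by tau_slow.
```

In such a fit, the relative weight of the two amplitudes would reflect how much of the thermalization proceeds via impact ionization versus higher-order processes.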
Chebyshev expansion for Impurity Models using Matrix Product States
We improve a recently developed expansion technique for calculating real
frequency spectral functions of any one-dimensional model with short-range
interactions, by postprocessing computed Chebyshev moments with linear
prediction. This can be achieved at virtually no cost and, in sharp contrast to
existing methods based on the damping of the moments, improves the spectral
resolution rather than lowering it. We validate the method for the exactly
solvable resonating level model and the single impurity Anderson model. It is
capable of resolving sharp Kondo resonances, as well as peaks within the
Hubbard bands when employed as an impurity solver for dynamical mean-field
theory (DMFT). Our method works at zero temperature and allows for arbitrary
discretization of the bath spectrum. It achieves similar precision as the
dynamical density matrix renormalization group (DDMRG), at lower cost. We also
propose an alternative expansion, of 1-exp(-tau H) instead of the usual H,
which opens the possibility of using established methods for the time evolution
of matrix product states to calculate spectral functions directly.

Comment: 13 pages, 9 figures
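The linear-prediction postprocessing step can be sketched as follows: fit an autoregressive model to the computed moment sequence by least squares and use the fitted recurrence to extrapolate further moments. The formulation and the test sequence below are our own illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Sketch of linear prediction for extrapolating a sequence of Chebyshev
# moments mu_n (illustrative; not the authors' code).
def linear_prediction(x, order, n_extra):
    """Extend sequence x by n_extra terms via an order-p autoregressive
    model fitted by least squares: x[t] ~ sum_k a[k] * x[t-1-k]."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Each row holds the 'order' values preceding x[t], newest first.
    rows = np.array([x[t - order:t][::-1] for t in range(order, n)])
    targets = x[order:n]
    coeffs, *_ = np.linalg.lstsq(rows, targets, rcond=None)
    # Recursively extend the sequence with the fitted recurrence.
    ext = list(x)
    for _ in range(n_extra):
        ext.append(np.dot(coeffs, ext[-1:-order - 1:-1]))
    return np.array(ext)

# A damped oscillation obeys an exact order-2 linear recurrence, so
# linear prediction reproduces its continuation essentially exactly.
n = np.arange(40)
moments = 0.95**n * np.cos(0.3 * n)
extended = linear_prediction(moments, order=4, n_extra=20)
```

The point of the method is visible here: the extrapolated tail is generated at virtually no cost, whereas simply truncating (or damping) the moment sequence would broaden every spectral feature.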
Efficient DMFT impurity solver using real-time dynamics with Matrix Product States
We propose to calculate spectral functions of quantum impurity models using
the Time Evolving Block Decimation (TEBD) for Matrix Product States. The
resolution of the spectral function is improved by a so-called linear
prediction approach. We apply the method as an impurity solver within the
Dynamical Mean Field Theory (DMFT) for the single- and two-band Hubbard model
on the Bethe lattice. For the single-band model we observe sharp features at
the inner edges of the Hubbard bands. A finite size scaling shows that they
remain present in the thermodynamic limit. We analyze the real time-dependence
of the double occupation after adding a single electron and observe
oscillations at the same energy as the sharp feature in the Hubbard band,
indicating a long-lived coherent superposition of states that correspond to the
Kondo peak and the side peaks. For a two-band Hubbard model we observe an even
richer structure in the Hubbard bands, which cannot be related to a multiplet
structure of the impurity, in addition to sharp excitations at the band edges
of a type similar to the single-band case.

Comment: 14 figures, 12+ pages including appendix. New Fig. 4b, Fig. 6, Fig. 10, Fig. 11 and Fig. A
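The link between real-time oscillations of an observable and a sharp spectral feature can be illustrated with a toy signal: a damped oscillation in the "double occupation" whose frequency is then read off from its Fourier spectrum. The signal and all parameters are hypothetical, not data from the paper.

```python
import numpy as np

# Toy example: extract the oscillation energy of a time-dependent
# observable (e.g. the double occupation after adding an electron)
# from its Fourier spectrum. All numbers are made up.
dt = 0.05
t = np.arange(0.0, 200.0, dt)
omega0 = 1.3                     # hypothetical oscillation energy
d_occ = 0.1 + 0.02 * np.cos(omega0 * t) * np.exp(-t / 50.0)

# Fourier transform of the oscillating part (mean removed to kill
# the static offset).
spectrum = np.abs(np.fft.rfft(d_occ - d_occ.mean()))
freqs = 2 * np.pi * np.fft.rfftfreq(len(t), d=dt)  # angular frequencies
omega_peak = freqs[np.argmax(spectrum)]
# omega_peak lies at (approximately) omega0, mirroring how the observed
# oscillation energy matches the sharp feature in the Hubbard band.
```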
An Efficient, Practical Algorithm and Implementation for Computing Multiplicatively Weighted Voronoi Diagrams
We present a simple wavefront-like approach for computing multiplicatively
weighted Voronoi diagrams of points and straight-line segments in the Euclidean
plane. If the input sites can be assumed to be randomly weighted points, then
the use of a so-called overlay arrangement [Har-Peled & Raichel, Discrete
Comput. Geom. 53:547-568, 2015] allows us to achieve an expected runtime
complexity of , while still maintaining the simplicity of our approach. We
implemented the full algorithm for weighted points as input sites, based on
CGAL. The results of an experimental evaluation of our implementation suggest
as a practical bound on the runtime. Our algorithm can be extended to handle
additive weights in addition to multiplicative weights, and it yields a truly
simple solution to the one-dimensional version of this problem.
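The distance function underlying a multiplicatively weighted Voronoi diagram is easy to state: site s with weight w_s claims point p when |p − s| / w_s is minimal. The brute-force grid sketch below makes this concrete; it is O(grid cells × sites) and has nothing to do with the wavefront algorithm or the CGAL implementation above, and the sites and weights are arbitrary examples.

```python
import numpy as np

# Brute-force multiplicatively weighted Voronoi diagram on a pixel grid
# (illustrative only; not the paper's wavefront algorithm).
sites = np.array([[0.2, 0.3], [0.7, 0.6], [0.5, 0.9]])
weights = np.array([1.0, 2.0, 0.5])   # larger weight => larger region

xs, ys = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
grid = np.stack([xs, ys], axis=-1)            # shape (200, 200, 2)

# Weighted distance from every grid point to every site: |p - s| / w_s.
diff = grid[:, :, None, :] - sites[None, None, :, :]
dist = np.linalg.norm(diff, axis=-1) / weights
labels = dist.argmin(axis=-1)                 # index of the owning site
```

Note the characteristic effect of multiplicative weights: a heavily weighted site captures territory far from itself, which is why the bisectors are circular arcs rather than straight lines.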
Generalized offsetting of planar structures using skeletons
We study different means to extend offsetting based on skeletal structures beyond the well-known constant-radius and mitered offsets supported by Voronoi diagrams and straight skeletons, for which the orthogonal distance of offset elements to their respective input elements is constant and uniform over all input elements. Our main contribution is a new geometric structure, called variable-radius Voronoi diagram, which supports the computation of variable-radius offsets, i.e., offsets whose distance to the input is allowed to vary along the input. We discuss properties of this structure and sketch a prototype implementation that supports the computation of variable-radius offsets based on this new variant of Voronoi diagrams.
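The idea of an offset whose distance varies along the input can be illustrated with a toy construction: push each vertex of a polyline along its (averaged) vertex normal by its own radius. This is a naive per-vertex sketch for intuition only, not the variable-radius Voronoi diagram of the paper.

```python
import numpy as np

# Toy variable-radius offset of an open polyline: each vertex i is moved
# along its averaged unit normal by its own radius r_i, so the offset
# distance varies along the input. Illustrative only.
def variable_radius_offset(points, radii):
    points = np.asarray(points, dtype=float)
    # Unit normals of each segment (tangent rotated by 90 degrees).
    tangents = np.diff(points, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    normals = np.column_stack([-tangents[:, 1], tangents[:, 0]])
    # Vertex normals: average of adjacent segment normals, renormalized.
    vnormals = np.vstack([normals[0],
                          (normals[:-1] + normals[1:]) / 2,
                          normals[-1]])
    vnormals /= np.linalg.norm(vnormals, axis=1, keepdims=True)
    return points + np.asarray(radii)[:, None] * vnormals

line = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
offset = variable_radius_offset(line, [0.1, 0.2, 0.3])
```

A constant-radius offset is the special case where all entries of `radii` coincide; the skeletal structures discussed above exist precisely to compute such offsets robustly, including the topology changes this naive per-vertex scheme cannot handle.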
Reducing the Probability of False Positive Research Findings by Pre-Publication Validation - Experience with a Large Multiple Sclerosis Database
*Objective*
We have assessed the utility of a pre-publication validation policy in reducing the probability of publishing false positive research findings. 
*Study design and setting*
The large database of the Sylvia Lawry Centre for Multiple Sclerosis Research was split in two parts: one for hypothesis generation and a validation part for confirmation of selected results. We present case studies from 5 finalized projects that have used the validation policy and results from a simulation study.
*Results*
In one project, the "relapse and disability" project as described in section II (example 3), findings could not be confirmed in the validation part of the database. The simulation study showed that the percentage of false positive findings can exceed 20% depending on variable selection. 
*Conclusion*
We conclude that the validation policy has prevented the publication of at least one research finding that could not be validated in an independent data set (and probably would have been a "true" false-positive finding) over the past three years, and has led to improved data analysis, statistical programming, and selection of hypotheses. The advantages outweigh the lost statistical power inherent in the process.
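The mechanism behind the simulation result can be sketched in a few lines: screen many truly unrelated variables against an outcome, report the best one, and check whether it survives in an independent validation split. The sample sizes, thresholds, and trial counts below are our own illustrative choices, not the setup of the Sylvia Lawry Centre study.

```python
import numpy as np

# Simulation sketch of the false-positive problem that pre-publication
# validation targets. All variables are pure noise by construction, so
# any "significant" finding is a false positive. Parameters are made up.
rng = np.random.default_rng(0)

def best_variable_replicates(n_subjects=200, n_vars=50):
    """One experiment: is the most-correlated (null) variable
    'significant' in the discovery half, and again in the validation half?"""
    y = rng.standard_normal(n_subjects)
    x = rng.standard_normal((n_subjects, n_vars))   # all truly unrelated
    half = n_subjects // 2
    # Discovery half: pick the variable with the largest |correlation|.
    r_disc = np.abs([np.corrcoef(x[:half, j], y[:half])[0, 1]
                     for j in range(n_vars)])
    best = int(np.argmax(r_disc))
    # Validation half: test that same variable on independent data.
    r_val = abs(np.corrcoef(x[half:, best], y[half:])[0, 1])
    threshold = 1.96 / np.sqrt(half)   # approximate 5% cutoff for |r|
    return r_disc[best] > threshold, r_val > threshold

trials = [best_variable_replicates() for _ in range(200)]
discovery_rate = np.mean([d for d, _ in trials])
validation_rate = np.mean([v for _, v in trials])
# discovery_rate is large (selection over 50 null variables), while
# validation_rate falls back to roughly the nominal 5% level.
```

The gap between the two rates is exactly what a split-sample validation policy exploits: spurious variable-selection "findings" from the discovery part rarely replicate in the held-out part.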